West Coast — definition
1. West Coast (Noun)
Definition: The western seaboard of the United States, from Washington to southern California.
Types of (4): geographic area, geographic region, geographical area, geographical region
Parts of (2): West, western United States